1. List all databases in MySQL
sqoop list-databases --connect jdbc:mysql://localhost:3306/ --username dyh --password 000000
2. Connect to MySQL and list the tables in a database
sqoop list-tables --connect jdbc:mysql://localhost:3306/test --username dyh --password 000000
3. Copy the table structure of a relational table into Hive
sqoop create-hive-table --connect
Usage
1. Importing data from MySQL to HDFS
1.1 ./sqoop import --connect jdbc:mysql://192.168.116.132:3306/sqoop --username root --password 123456 --table test_user --target-dir /sqoop/test_user -m 2 --fields-terminated-by "\t" --columns "id,name" --where 'id>2 and id
--connect      JDBC connection string for the database
--username     database user
--password     password
--table        table name
--target-dir   target directory on HDFS
record terminator characters. Sqoop also supports different data formats for importing data; for example, you can easily import data in Avro format by specifying the --as-avrodatafile option with the import command. There are many other options that Sqoop provides which can be used to further tune the import operation to suit your specific requirements.

Importing Data into Hive

In many cases, importing data into Hive is the same as running
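As a sketch of the Avro option just mentioned (the host, credentials, table, and target directory below are placeholder assumptions reusing example values from earlier on this page):

```shell
# Sketch: the same style of import as above, but writing Avro data files.
# The host, credentials, table, and target directory are placeholders.
AVRO_CMD="sqoop import \
 --connect jdbc:mysql://localhost:3306/test \
 --username dyh --password 000000 \
 --table test_user \
 --target-dir /sqoop/test_user_avro \
 --as-avrodatafile"

# Print rather than execute, since running it needs a live cluster.
echo "${AVRO_CMD}"
```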
command, which reads the schema of the relational database and establishes a mapping between Hadoop fields and database table fields. The input command is then converted into a MapReduce job, in which many map tasks read the data from HDFS in parallel and copy it into the database. Let's take a look at how Sqoop uses the command line to ex
13/09/15 08:15:36 INFO hive.HiveImport: Loading uploaded data into Hive
13/09/15 08:15:36 INFO manager.MySQLManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
13/09/15 08:15:36 INFO manager.MySQLManager: Executing SQL statement: SELECT t.* FROM `test` AS t LIMIT 1
13/09/15 08:15:41 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/home/hadoop/hive-0.10.0/lib/hive-common-0.10.0.jar!/hive-log4j.properties
13/09/15 08:15:41 INFO hive.HiveImport: Hive history file=
hive.HiveImport: Hive import complete.
III. Sqoop Commands
Sqoop has about 13 commands, plus several common parameters that apply to all 13 of them. The 13 commands are listed first, followed by the common parameters for
1. Sqoop installation
1.1 Integrate with Hadoop and Hive by modifying the /opt/cdh/sqoop-1.4.5-cdh5.3.6/conf/sqoop-env.sh file
1.2 Verify that the installation succeeded: bin/sqoop version shows the Sqoop version
2. Sqoop basic operations
2.1 View
from your local computer's file system, then you should use one of the tools discussed in this article. The same article also discusses how to import data to HDFS from SQL Database/SQL Server using Sqoop. In this blog I'll elaborate on the same with an example and try to provide more detailed information along the way. What do I need to do for Sqoop to work in my HDInsight cluster? HDInsight 2.1 includes
$ sqoop help
usage: sqoop COMMAND [ARGS]
Available commands:
  codegen              Generate code to interact with database records
  create-hive-table    Import a table definition into Hive
  eval                 Evaluate a SQL statement and display the results
  export               Export an HDFS directory to a database table
  help                 List available commands
Business Background
Use Sqoop to query, insert, and delete data in MySQL.
Business Implementation
Select operation:
sqoop eval --connect jdbc:mysql://127.0.0.1:3306/market --username admin --password 123456 --query "select end_user_id, category_id, score, last_bought_date, days_left, update_time
The results are as follows:
[[email protected] /home/pms/workspace/ouyangyewei/data]
$ sqoop eval > --connect j
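The background above also mentions using eval for inserts and deletes; here is a hedged sketch of those two cases (the user_score table and its columns are hypothetical, as are the connection details):

```shell
# Sketch: sqoop eval can run DML statements as well as SELECTs.
# The table name, columns, and connection details below are hypothetical.
INSERT_CMD="sqoop eval --connect jdbc:mysql://127.0.0.1:3306/market \
 --username admin --password 123456 \
 --query \"INSERT INTO user_score (end_user_id, score) VALUES (1001, 95)\""

DELETE_CMD="sqoop eval --connect jdbc:mysql://127.0.0.1:3306/market \
 --username admin --password 123456 \
 --query \"DELETE FROM user_score WHERE end_user_id = 1001\""

# Print the commands; executing them requires a reachable MySQL instance.
echo "${INSERT_CMD}"
echo "${DELETE_CMD}"
```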
Sqoop plays an important role as a bridge for data import and export between Hadoop and traditional databases. By explaining Sqoop's basic syntax and features, this article examines its function and value. First, what is Apache Sqoop? It is an Apache open-source project originally developed by Cloudera; the name is short for SQL-to-Hadoop. Main
Sqoop error when importing a MySQL table into Hive:
[[email protected] 172-+-1-221 lib]# sqoop import --connect jdbc:mysql://54.223.175.12:3308/gxt3 --username guesttest --password guesttest --table ecomaccessv3 -m 1 --hive-import
Warning: /opt/cloudera/parcels/CDH-5.10.0-1.cdh5.10.0.p0.a/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail. Please set $ACCU
Sqoop date/time type error when importing data
This problem plagued me for a long time: when using sqoop import to bring data from a MySQL database into HDFS, an error was reported, until an invalid time/date value was found to be the cause.
Hive only supports the timestamp type, while the corresponding MySQL type is datetime. When a datetime value is invalid, the import fails.
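One common workaround, assuming the failures come from MySQL's all-zero datetime values, is to ask Connector/J to return them as NULL via the zeroDateTimeBehavior URL parameter (the host, database, credentials, and the orders table below are placeholders):

```shell
# Sketch: append zeroDateTimeBehavior=convertToNull to the JDBC URL so that
# invalid zero datetimes come back as NULL instead of aborting the import.
# Host, database, credentials, and table are placeholders.
FIX_CMD="sqoop import \
 --connect 'jdbc:mysql://localhost:3306/test?zeroDateTimeBehavior=convertToNull' \
 --username dyh --password 000000 \
 --table orders \
 --target-dir /sqoop/orders"

echo "${FIX_CMD}"
```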
To avoid this, Sqoop does not allow you to specify multiple map tasks and only allows '-m 1'; that is, the import/export must run serially.
2) Specify --split-by and select a field suitable for splitting. The --split-by option applies to importing/exporting table data without a primary key; it is used together with --num-mappers.
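A sketch of point 2) above; the access_log table and its log_id column are hypothetical stand-ins for a table without a primary key, and the connection details are placeholders:

```shell
# Sketch: for a table with no primary key, pick a reasonably uniform numeric
# column for --split-by and pair it with --num-mappers for parallelism.
# Table, column, and connection details are placeholders.
SPLIT_CMD="sqoop import \
 --connect jdbc:mysql://localhost:3306/test \
 --username dyh --password 000000 \
 --table access_log \
 --split-by log_id \
 --num-mappers 4 \
 --target-dir /sqoop/access_log"

echo "${SPLIT_CMD}"
```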
3) Split multiple sqoop
fast access to data through a path other than JDBC, enabled with --direct.
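A hedged sketch of the --direct path mentioned above (for MySQL, Sqoop's direct mode delegates to the native mysqldump tool; the connection values and table are placeholders):

```shell
# Sketch: --direct bypasses JDBC and uses the database's native bulk tools
# where Sqoop has a direct connector (e.g. mysqldump for MySQL).
# Connection details and table are placeholders.
DIRECT_CMD="sqoop import \
 --connect jdbc:mysql://localhost:3306/test \
 --username dyh --password 000000 \
 --table test_user \
 --target-dir /sqoop/test_user_direct \
 --direct"

echo "${DIRECT_CMD}"
```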
The following is based on sqoop-1.4.3.
Installation
For Sqoop installation, refer to http://www.54chen.com/java-ee/sqoop-mysql-to-hive.html; it has been tested and works.
Tools
Sqoop contains a series of tools; run sqoop help to see the relevant
$ ./sqoop
HIVE_HOME=/home/hadoop/hive-0.8.1
Now we can run a test. We interact primarily through Hive: in effect, we submit data from a relational database into Hive and store it on HDFS for big-data computation.
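The flow just described (relational database into Hive, stored on HDFS) can be sketched as a single command; the connection details and table name are placeholder assumptions:

```shell
# Sketch: --hive-import stages the table on HDFS and then loads it into the
# Hive warehouse in a single step. Connection details are placeholders.
HIVE_CMD="sqoop import \
 --connect jdbc:mysql://localhost:3306/test \
 --username dyh --password 000000 \
 --table test_user \
 --hive-import \
 --hive-table test_user"

echo "${HIVE_CMD}"
```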
Sqoop mainly includes the following commands or functions.
codegen              Generate code to interact with database records
create-hive-table    Import a table definition into Hive
eval                 Evaluate a SQL statement and display the results
export               Export an HDFS directory to a database table